An example of predicting the DAST score class with different methods, and of interpreting the features' effect on DAST.

DAST

Case (1):

DAST

1) Decision Tree

GINI

Entropy

Pruned
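The three decision-tree variants listed above (Gini, entropy, and pruned) can be sketched in scikit-learn as follows. This is an illustrative sketch, not the study's actual code: the dataset is synthetic and the `ccp_alpha` pruning strength is an assumed placeholder value.

```python
# Sketch: the three decision-tree variants -- Gini impurity, entropy, and a
# cost-complexity-pruned tree. Data and hyperparameters are placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

variants = {
    "gini": DecisionTreeClassifier(criterion="gini", random_state=0),
    "entropy": DecisionTreeClassifier(criterion="entropy", random_state=0),
    # Minimal cost-complexity pruning: larger ccp_alpha prunes more aggressively.
    "pruned": DecisionTreeClassifier(criterion="gini", ccp_alpha=0.01, random_state=0),
}
scores = {name: clf.fit(X_train, y_train).score(X_test, y_test)
          for name, clf in variants.items()}
print(scores)
```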

2) Random Forest

GINI

Entropy

Pruned
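The same three variants apply to the random forest: scikit-learn's `RandomForestClassifier` accepts `criterion="gini"`/`"entropy"` and per-tree cost-complexity pruning via `ccp_alpha`. A minimal sketch with placeholder data and settings:

```python
# Sketch: random-forest variants with Gini, entropy, and per-tree pruning.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forests = {
    "gini": RandomForestClassifier(criterion="gini", n_estimators=100, random_state=0),
    "entropy": RandomForestClassifier(criterion="entropy", n_estimators=100, random_state=0),
    # ccp_alpha is applied to every tree in the ensemble.
    "pruned": RandomForestClassifier(criterion="gini", ccp_alpha=0.01, random_state=0),
}
scores = {name: f.fit(X_train, y_train).score(X_test, y_test)
          for name, f in forests.items()}
print(scores)
```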

3) Gradient Boosting Classifier

Pruned
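A hedged sketch of the gradient-boosting variant: `GradientBoostingClassifier` also supports `ccp_alpha` pruning on its individual trees. The hyperparameter values below are assumptions, not the study's tuned settings.

```python
# Sketch: gradient boosting with and without cost-complexity pruning.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                max_depth=3, random_state=0)
gb_pruned = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                       max_depth=3, ccp_alpha=0.01, random_state=0)
acc = gb.fit(X_train, y_train).score(X_test, y_test)
acc_pruned = gb_pruned.fit(X_train, y_train).score(X_test, y_test)
print(acc, acc_pruned)
```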

4) Gaussian Naive Bayes

5) Support Vector Machine (SVM)

6) Neural Network
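The three non-tree classifiers above (Gaussian Naive Bayes, SVM, and a neural network) can be sketched in scikit-learn as below. The pipelines and hyperparameters are illustrative assumptions; the SVM and neural network are standardized because both are sensitive to feature scale.

```python
# Sketch: GaussianNB, an RBF-kernel SVM, and an MLP, on placeholder data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "GaussianNB": GaussianNB(),
    # SVMs and neural networks are scale-sensitive, so standardize first.
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "NeuralNet": make_pipeline(StandardScaler(),
                               MLPClassifier(hidden_layer_sizes=(32,),
                                             max_iter=1000, random_state=0)),
}
scores = {name: m.fit(X_train, y_train).score(X_test, y_test)
          for name, m in models.items()}
print(scores)
```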

7) LightGBM Model Parameters and Training
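An illustrative LightGBM parameter dictionary in the format expected by `lightgbm.train()`. Every value here is an assumption (including the number of DAST classes), not the study's tuned configuration.

```python
# Illustrative LightGBM parameters -- placeholder values, not tuned settings.
params = {
    "objective": "multiclass",      # DAST score class prediction
    "num_class": 3,                 # placeholder number of DAST classes
    "boosting_type": "gbdt",
    "learning_rate": 0.05,
    "num_leaves": 31,
    "max_depth": -1,                # -1 means no depth limit
    "feature_fraction": 0.9,
    "bagging_fraction": 0.8,
    "bagging_freq": 5,
    "metric": "multi_logloss",
}
# Typical usage (requires the lightgbm package):
# model = lightgbm.train(params, lightgbm.Dataset(X_train, y_train),
#                        num_boost_round=200)
```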

A very important note on tree methods:

We will show that the impurity-based feature importance can inflate the importance of numerical features.

Furthermore, the impurity-based feature importance of trees suffers from being computed on statistics derived from the training dataset: the importances can be high even for features that are not predictive of the target variable, as long as the model has the capacity to use them to overfit.
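The pitfall described above can be demonstrated with scikit-learn's `permutation_importance`. In this sketch (synthetic data, assumed settings), a pure-noise numeric column still receives nonzero impurity-based importance because the forest can split on it to fit the training set, while permutation importance measured on held-out data exposes it as uninformative.

```python
# Sketch: impurity-based vs. permutation importance for a noise feature.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=5,
                           n_informative=3, random_state=0)
rng = np.random.RandomState(0)
X = np.hstack([X, rng.randn(len(X), 1)])  # append a pure-noise numeric column

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

impurity_imp = rf.feature_importances_[-1]     # computed on training statistics
perm_imp = permutation_importance(rf, X_test, y_test, n_repeats=10,
                                  random_state=0).importances_mean[-1]
print(impurity_imp, perm_imp)
```

On held-out data, the permutation importance of the noise column hovers around zero, while its impurity-based importance does not.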

8) XGBoost
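For XGBoost, an illustrative parameter dictionary in the format expected by `xgboost.train()`. As with the LightGBM sketch, all values (including the class count) are assumptions for illustration only.

```python
# Illustrative XGBoost parameters -- placeholder values, not tuned settings.
params = {
    "objective": "multi:softprob",  # multiclass probabilities
    "num_class": 3,                 # placeholder number of DAST classes
    "eta": 0.1,                     # learning rate
    "max_depth": 4,
    "subsample": 0.8,
    "colsample_bytree": 0.8,
    "eval_metric": "mlogloss",
}
# Typical usage (requires the xgboost package):
# model = xgboost.train(params, xgboost.DMatrix(X_train, label=y_train),
#                       num_boost_round=200)
```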

9) 1D Convolutional Neural Network

With/without standardizing the data

Number of Filters

Size of Kernel
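How the number of filters and the kernel size determine a 1D convolution's output shape can be shown framework-agnostically with numpy. This is an illustrative sketch (valid padding, stride 1), not the study's network code.

```python
# Sketch: "number of filters" and "kernel size" set a 1D conv's output shape.
import numpy as np

def conv1d(x, kernels):
    """x: (length, channels); kernels: (n_filters, kernel_size, channels)."""
    n_filters, k, _ = kernels.shape
    out_len = x.shape[0] - k + 1          # valid padding, stride 1
    out = np.empty((out_len, n_filters))
    for f in range(n_filters):
        for i in range(out_len):
            out[i, f] = np.sum(x[i:i + k] * kernels[f])
    return out

x = np.random.randn(100, 1)           # 100 time steps, 1 input channel
kernels = np.random.randn(8, 5, 1)    # 8 filters, kernel size 5
y_out = conv1d(x, kernels)
print(y_out.shape)  # (96, 8): length 100 - 5 + 1, one channel per filter
```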

Multi-Headed Convolutional Neural Network

10) Graph Neural Network
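The core update of a basic graph neural network layer can be sketched in numpy: the GCN rule H' = ReLU(D^(-1/2) (A + I) D^(-1/2) H W), where A is the adjacency matrix and W the learnable weights. The graph and dimensions below are placeholders for illustration.

```python
# Sketch: one graph-convolution (GCN) layer in plain numpy.
import numpy as np

def gcn_layer(A, H, W):
    """A: (n, n) adjacency; H: (n, d_in) node features; W: (d_in, d_out)."""
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt     # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)       # ReLU activation

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # 3-node path graph
H = np.random.randn(3, 4)                # node features
W = np.random.randn(4, 2)                # learnable weights
H_next = gcn_layer(A, H, W)
print(H_next.shape)  # (3, 2): one new feature vector per node
```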